
    New Coherence and RIP Analysis for Weak Orthogonal Matching Pursuit

    In this paper we define a new coherence index, named the global 2-coherence, of a given dictionary and study its relationship with the traditional mutual coherence and the restricted isometry constant. By exploring this relationship, we obtain more general results on sparse signal reconstruction using greedy algorithms in the compressive sensing (CS) framework. In particular, we obtain an improved bound over the best known results on the restricted isometry constant for successful recovery of sparse signals using orthogonal matching pursuit (OMP). Comment: arXiv admin note: substantial text overlap with arXiv:1307.194
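    The two objects this abstract relies on, the mutual coherence of a dictionary and OMP itself, are easy to make concrete. Below is a minimal NumPy sketch, assuming a dictionary with (approximately) unit-norm columns; it implements plain OMP for illustration only, not the weak OMP variant or the global 2-coherence analysis of the paper.

```python
import numpy as np

def mutual_coherence(D):
    """Largest |<d_i, d_j>| over distinct, normalized columns of D."""
    Dn = D / np.linalg.norm(D, axis=0)
    G = np.abs(Dn.T @ Dn)
    np.fill_diagonal(G, 0.0)
    return G.max()

def omp(D, y, k):
    """Plain OMP: greedily add the column most correlated with the residual,
    then re-fit the coefficients on the current support by least squares."""
    residual, support = y.astype(float), []
    coef = np.zeros(0)
    for _ in range(k):
        support.append(int(np.argmax(np.abs(D.T @ residual))))
        coef, *_ = np.linalg.lstsq(D[:, support], y, rcond=None)
        residual = y - D[:, support] @ coef
    x = np.zeros(D.shape[1])
    x[support] = coef
    return x

# toy recovery of a 3-sparse vector from a random unit-norm dictionary
rng = np.random.default_rng(0)
D = rng.standard_normal((64, 256))
D /= np.linalg.norm(D, axis=0)
x_true = np.zeros(256)
x_true[[3, 50, 200]] = [1.0, -2.0, 0.5]
y = D @ x_true
print(mutual_coherence(D), np.linalg.norm(omp(D, y, 3) - x_true))
```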

    High Order Methods for a Class of Volterra Integral Equations with Weakly Singular Kernels

    The solution of the Volterra integral equation $(*)\;\; x(t) = g_1(t) + \sqrt{t}\,g_2(t) + \int_0^t \frac{K(t,s,x(s))}{\sqrt{t-s}}\,ds$, $0 \leqq t \leqq T$, where $g_1(t)$, $g_2(t)$ and $K(t,s,x)$ are smooth functions, can be represented as $x(t) = u(t) + \sqrt{t}\,v(t)$, $0 \leqq t \leqq T$, where $u(t)$, $v(t)$ are smooth and satisfy a system of Volterra integral equations. In this paper, numerical schemes for the solution of (*) are suggested which calculate $x(t)$ via $u(t)$, $v(t)$ in a neighborhood of the origin and use (*) on the rest of the interval $0 \leqq t \leqq T$. In this way, methods of arbitrarily high order can be derived. As an example, schemes based on the product integration analogue of Simpson's rule are treated in detail. The schemes are shown to be convergent of order $h^{7/2}$. Asymptotic error estimates are derived in order to examine the numerical stability of the methods.
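    As a concrete, hedged illustration of product integration for this kind of weakly singular equation, the sketch below treats a single equation $x(t) = g(t) + \int_0^t K(t,s,x(s))/\sqrt{t-s}\,ds$ with a low-order (Euler-type) product rule: the $1/\sqrt{t-s}$ factor is integrated exactly on each subinterval while $K$ is frozen at the left endpoint. It is not the paper's Simpson-based scheme and does not use the $u$, $v$ splitting near the origin.

```python
import numpy as np

def product_euler_volterra(g, K, T, n):
    """Explicit product-integration (Euler-type) scheme for
        x(t) = g(t) + int_0^t K(t, s, x(s)) / sqrt(t - s) ds,  0 <= t <= T,
    on a uniform grid with n steps.  K must accept array arguments in its
    second and third slots.  First-order illustration only."""
    h = T / n
    t = np.linspace(0.0, T, n + 1)
    x = np.empty(n + 1)
    x[0] = g(0.0)                       # the integral vanishes at t = 0
    for i in range(1, n + 1):
        # exact weights  w_j = int_{t_j}^{t_{j+1}} (t_i - s)^{-1/2} ds
        w = 2.0 * (np.sqrt(t[i] - t[:i]) - np.sqrt(t[i] - t[1:i + 1]))
        x[i] = g(t[i]) + np.sum(w * K(t[i], t[:i], x[:i]))
    return t, x

# toy problem with exact solution x(t) = sqrt(t):
#   x(t) = sqrt(t) - (pi/2) t + int_0^t x(s)/sqrt(t-s) ds,
# since int_0^t sqrt(s)/sqrt(t-s) ds = (pi/2) t.
g = lambda t: np.sqrt(t) - 0.5 * np.pi * t
K = lambda t, s, x: x
t, x = product_euler_volterra(g, K, 1.0, 400)
print(np.max(np.abs(x - np.sqrt(t))))   # max error; shrinks as n grows
```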

    A Note on Error Bounds for Pseudo Skeleton Approximations of Matrices

    Due to their importance in both data analysis and numerical algorithms, low rank approximations have recently been widely studied. They enable the handling of very large matrices. Tight error bounds for the computationally efficient Gaussian elimination based methods (skeleton approximations) are available. In practice, these bounds are useful for matrices with singular values which decrease quickly. Using the Chebyshev norm, this paper provides improved bounds for the errors of the matrix elements. These bounds are substantially better in the practically relevant cases where the eigenvalues decrease polynomially. Results are proven for general real rectangular matrices. Even stronger bounds are obtained for symmetric positive definite matrices. A simple example is given, comparing these new bounds to earlier ones. Comment: 8 pages, 1 figure
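    A skeleton (cross) approximation in the Gaussian-elimination style mentioned above can be sketched in a few lines: each step subtracts the rank-one cross through the entry of largest absolute value (full pivoting), and the error is measured in the Chebyshev norm, i.e. the largest absolute matrix element. The test matrix below is a Hilbert matrix, chosen only because its singular values decay quickly; it is not one of the examples from the paper.

```python
import numpy as np

def skeleton_approx(A, r):
    """Rank-r cross (skeleton) approximation by full-pivot Gaussian elimination:
    repeatedly subtract the rank-one cross through the largest remaining entry."""
    R = A.astype(float).copy()
    approx = np.zeros_like(R)
    for _ in range(r):
        i, j = np.unravel_index(np.argmax(np.abs(R)), R.shape)
        if R[i, j] == 0.0:
            break                        # matrix already reproduced exactly
        cross = np.outer(R[:, j], R[i, :]) / R[i, j]
        approx += cross
        R -= cross
    return approx

# Chebyshev-norm error on a symmetric positive definite Hilbert matrix
n = 200
A = 1.0 / (np.arange(n)[:, None] + np.arange(n)[None, :] + 1.0)
for r in (2, 4, 8):
    print(r, np.max(np.abs(A - skeleton_approx(A, r))))
```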